
    Advanced VLBI Imaging

    Very Long Baseline Interferometry (VLBI) is an observational technique developed in astronomy for combining multiple radio telescopes into a single virtual instrument with an effective aperture of up to many thousands of kilometers, enabling measurements at the highest angular resolutions. Celebrated examples of applying VLBI to astrophysical studies include detailed, high-resolution images of the innermost parts of relativistic outflows (jets) in active galactic nuclei (AGN) and the recent pioneering observations of the shadows of the supermassive black holes (SMBH) at the center of our Galaxy and in the galaxy M87. Despite these and many other proven successes of VLBI, the analysis and imaging of VLBI data remain difficult, owing in part to the fact that VLBI imaging inherently constitutes an ill-posed inverse problem. Historically, this problem has been addressed in radio interferometry by the CLEAN algorithm, a matching-pursuit inverse modeling method developed in the early 1970s and since then established as the de facto standard approach for imaging VLBI data. In recent years, the constantly increasing demand for improved quality and fidelity of interferometric image reconstruction has led to several attempts to employ new approaches, such as forward modeling and Bayesian estimation, for VLBI imaging. While the current state-of-the-art forward modeling and Bayesian techniques may outperform CLEAN in terms of accuracy, resolution, robustness, and adaptability, they also tend to require more complex structures and longer computation times, and rely on extensive fine-tuning of a larger number of non-trivial hyperparameters. This leaves ample room for further searches for potentially more effective imaging approaches and provides the main motivation for this dissertation and its particular focus on the need to unify algorithmic frameworks and to study VLBI imaging from the perspective of inverse problems in general. In pursuit of this goal, and based on an extensive qualitative comparison of the existing methods, this dissertation comprises the development, testing, and first implementations of two novel concepts for improved interferometric image reconstruction. The concepts combine the known benefits of current forward modeling techniques, develop more automatic and less supervised algorithms for image reconstruction, and realize them within two different frameworks. The first framework unites multiscale imaging algorithms in the spirit of compressive sensing with a dictionary adapted to the uv-coverage and its defects (DoG-HiT, DoB-CLEAN); we extend this approach to dynamic imaging and to polarimetric imaging. The core components of this framework are realized in the multidisciplinary, multipurpose software package MrBeam, developed as part of this dissertation. The second framework employs a multiobjective evolutionary algorithm (MOEA/D) to achieve fully unsupervised image reconstruction and hyperparameter optimization. These new methods are shown to outperform existing methods in various metrics such as angular resolution, structural sensitivity, and degree of supervision. We demonstrate the great potential of these new techniques with selected applications to frontline VLBI observations of AGN jets and SMBH.
    In addition to improving the quality and robustness of image reconstruction, DoG-HiT, DoB-CLEAN, and MOEA/D provide novel capabilities such as the dynamic reconstruction of polarimetric images on minute time-scales and near-real-time, unsupervised data analysis (particularly useful for large imaging surveys). The techniques and software developed in this dissertation are also of interest for a wider range of inverse problems, including fields as diverse as Ly-alpha tomography (where we improve estimates of the thermal state of the intergalactic medium), the cosmographic search for dark matter (where we improve forecasted bounds on ultralight dilatons), medical imaging, and solar spectroscopy.
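
    As a toy illustration of the inverse-problem framing above, the sketch below runs a Högbom-style CLEAN loop (matching pursuit against a dirty beam) on synthetic data. It is not part of the dissertation's MrBeam software; the image size, dirty-beam shape, loop gain, and stopping threshold are all invented for the example.

```python
# Minimal Hogbom-style CLEAN sketch on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

n = 64
true_sky = np.zeros((n, n))
true_sky[20, 30] = 1.0                       # two toy point sources
true_sky[40, 25] = 0.6

# Toy "dirty beam": a Gaussian with sidelobe-like ripples, a stand-in for the
# point-spread function produced by an incomplete uv-coverage.
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
dirty_beam = np.exp(-(x**2 + y**2) / 18.0) * np.cos(0.4 * np.hypot(x, y))

def convolve(img, beam):
    """Circular convolution via FFT, with the beam centred at the origin."""
    return np.real(np.fft.ifft2(np.fft.fft2(img) * np.fft.fft2(np.fft.ifftshift(beam))))

dirty_image = convolve(true_sky, dirty_beam) + 0.01 * rng.standard_normal((n, n))

# Matching pursuit: repeatedly subtract a scaled, shifted dirty beam at the current peak.
model, residual, gain = np.zeros_like(dirty_image), dirty_image.copy(), 0.1
for _ in range(500):
    peak = np.unravel_index(np.argmax(np.abs(residual)), residual.shape)
    amp = gain * residual[peak]
    model[peak] += amp
    shifted = np.roll(np.roll(dirty_beam, peak[0] - n // 2, axis=0), peak[1] - n // 2, axis=1)
    residual -= amp * shifted
    if np.max(np.abs(residual)) < 0.05:      # arbitrary stopping threshold just above the noise
        break

print("recovered CLEAN components at:", np.argwhere(model > 0.1))
```

    The multiscale methods discussed in the abstract replace the delta-function components of this loop with wavelet dictionaries adapted to the uv-coverage, which is what makes the reconstruction less supervised and more robust.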

    Does trade integration alter monetary policy transmission?

    This paper explores the role of trade integration, or openness, for monetary policy transmission in a medium-scale New Keynesian model. Allowing for strategic complementarities in price-setting, we highlight a new dimension of the exchange rate channel through which monetary policy directly impacts domestic inflation. Although the strength of this effect increases with economic openness, it also requires that import prices respond to exchange rate changes. In this case, domestic producers find it optimal to adjust their prices in response to exchange rate changes, which alter the domestic-currency prices of their foreign competitors. We pin down key parameters of the model by matching impulse responses obtained from a vector autoregression on U.S. time series relative to an aggregate of industrialized countries. While we find evidence for strong complementarities, exchange rate pass-through is limited. Openness therefore has little bearing on monetary transmission in the estimated model.
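
    The estimation strategy mentioned above matches impulse responses obtained from a vector autoregression. As a minimal, self-contained illustration (not the paper's actual model, data, or estimator), the sketch below simulates a bivariate VAR(1), re-estimates its coefficients by least squares, and traces out the response to a unit shock; all coefficient values are invented.

```python
# Toy VAR(1) impulse-response sketch (illustrative; not the paper's model or data).
import numpy as np

rng = np.random.default_rng(1)

# Simulate two series (think "policy rate" and "inflation") from a known VAR(1).
A_true = np.array([[0.7, 0.0],
                   [-0.2, 0.5]])
T = 500
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = A_true @ y[t - 1] + 0.1 * rng.standard_normal(2)

# Estimate the VAR(1) coefficient matrix by ordinary least squares.
X, Y = y[:-1], y[1:]
A_hat = np.linalg.lstsq(X, Y, rcond=None)[0].T

# Impulse responses to a unit shock in the first variable, horizons 0..12.
shock = np.array([1.0, 0.0])
irf = np.array([np.linalg.matrix_power(A_hat, h) @ shock for h in range(13)])
print(np.round(irf, 3))
```

    In the paper's setting, model parameters would then be chosen so that the model-implied impulse responses match such VAR-based responses as closely as possible.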

    Multi-scale and Multi-directional VLBI Imaging with CLEAN

    Very long baseline interferometry (VLBI) is a radio-astronomical technique in which the correlated signals from various baselines are combined into an image of the highest angular resolution. Due to the sparsity of the measurements, this imaging procedure constitutes an ill-posed inverse problem. For decades the CLEAN algorithm has been the standard choice in VLBI studies, despite serious disadvantages and pathologies that are challenged by the requirements of modern frontline VLBI applications. We develop a novel multi-scale CLEAN deconvolution method (DoB-CLEAN) based on continuous wavelet transforms that addresses several pathologies of CLEAN imaging. We benchmark this novel algorithm against CLEAN reconstructions on synthetic data and reanalyze RadioAstron observations of BL Lac with DoB-CLEAN. DoB-CLEAN approaches the image by multi-scalar and multi-directional wavelet dictionaries. Two different dictionaries are used: first, a difference of elliptical spherical Bessel functions dictionary, fitted to the uv-coverage of the observation, which is used to sparsely represent the features in the dirty image; second, a difference of elliptical Gaussian wavelets dictionary that is well suited to represent relevant image features cleanly. The deconvolution is performed by switching between the dictionaries. DoB-CLEAN achieves super-resolution compared to CLEAN and remedies the spurious regularization properties of CLEAN. In contrast to CLEAN, the representation by basis functions has a physical meaning; hence, the computed deconvolved image still fits the observed visibilities, as opposed to CLEAN. State-of-the-art multi-scalar imaging approaches appear to outperform single-scalar standard approaches in VLBI and are well suited to maximize the extraction of information in ongoing frontline VLBI applications.
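
    A central ingredient of DoB-CLEAN is a multi-scale wavelet dictionary built from differences of smoothing kernels. The sketch below shows the generic difference-of-Gaussians construction on a random test image; it illustrates the idea only and is not the authors' implementation, and the scales are arbitrary.

```python
# Difference-of-Gaussians (DoG) subband decomposition sketch (illustrative only).
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
image = rng.random((128, 128))           # stand-in for a dirty image

scales = [1, 2, 4, 8, 16]                # arbitrary dyadic smoothing widths in pixels
smoothed = [gaussian_filter(image, s) for s in scales]

# Each subband is the difference between two adjacent smoothing scales;
# the coarsest smoothed map is kept as the residual background component.
subbands = [a - b for a, b in zip(smoothed[:-1], smoothed[1:])] + [smoothed[-1]]

# The decomposition is exactly invertible up to the first smoothing scale.
reconstruction = sum(subbands)
print(np.allclose(reconstruction, smoothed[0]))   # True
```

    DoB-CLEAN replaces the isotropic Gaussians of this construction with elliptical Bessel- and Gaussian-type kernels matched to the uv-coverage, and alternates between the two resulting dictionaries during deconvolution.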

    Dynamic and polarimetric VLBI imaging with a multiscalar approach

    Recently, multiscale imaging approaches such as DoG-HiT were developed to solve the VLBI imaging problem and have shown promising performance: they are fast, accurate, unbiased, and automatic. We extend the multiscalar imaging approach to polarimetric imaging, to reconstructions of dynamically evolving sources, and finally to dynamic polarimetric reconstructions. These extensions (mr-support imaging) utilize a multiscalar approach: the time-averaged Stokes I image is decomposed by a wavelet transform into single subbands. We use the set of statistically significant wavelet coefficients, the multiresolution support computed by DoG-HiT, as a prior in a constrained minimization: we fit the single-frame (polarimetric) observables by varying only the coefficients in the multiresolution support. The Event Horizon Telescope (EHT) is a VLBI array imaging supermassive black holes. We demonstrate on synthetic data that mr-support imaging offers ample regularization and is able to recover simple geometric dynamics at the horizon scale in a typical EHT setup. The approach is relatively lightweight, fast, and largely automatic and data-driven. The ngEHT is a planned extension of the EHT designed to recover movies at the event-horizon scale of a supermassive black hole. We benchmark the performance of mr-support imaging for the denser ngEHT configuration, demonstrating the major improvements that the additional ngEHT antennas will bring to dynamic, polarimetric reconstructions. Current and upcoming instruments offer the observational possibility of polarimetric imaging of dynamically evolving structural patterns with the highest spatial and temporal resolution. State-of-the-art dynamic reconstruction methods can capture this motion with a range of temporal regularizers and priors. With this work, we add an additional, simpler regularizer to the list: constraining the reconstruction to the multiresolution support.
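
    The key step described above is a constrained fit in which only the wavelet coefficients inside the multiresolution support are allowed to vary. The toy sketch below mimics that idea with an invented linear forward operator and a fixed support mask; it is a stand-in for the concept, not the authors' code.

```python
# Toy support-constrained least-squares fit (illustrative stand-in for mr-support imaging).
import numpy as np

rng = np.random.default_rng(3)

n_coeff, n_data = 50, 30
F = rng.standard_normal((n_data, n_coeff))   # invented forward operator: coefficients -> observables

support = np.zeros(n_coeff, dtype=bool)      # pretend multiresolution support from a previous fit
support[:10] = True

true_coeff = np.zeros(n_coeff)
true_coeff[:10] = rng.standard_normal(10)
data = F @ true_coeff + 0.01 * rng.standard_normal(n_data)

# Fit the data by varying ONLY the coefficients inside the support;
# everything outside the support stays fixed at zero.
coeff = np.zeros(n_coeff)
coeff[support] = np.linalg.lstsq(F[:, support], data, rcond=None)[0]

print("residual rms:", np.sqrt(np.mean((F @ coeff - data) ** 2)))
```

    In the actual method the fit is repeated frame by frame (and per Stokes parameter), so the support learned from the time-averaged Stokes I image regularizes the dynamic and polarimetric reconstructions.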

    Formation of Chain-Folded Structures from Supercooled Polymer Melts

    The formation of chain-folded structures from the melt, resembling the lamellae of polymer crystals, is observed in molecular dynamics simulations. Crystallization and subsequent melting temperatures are related linearly to the inverse lamellar thickness. Analysis of the single-chain conformations in the crystal shows that most chains reenter the same lamella through tight backfolds. The simulations are performed with a mesoscopic bead-spring model that includes a specific angle-bending potential. They demonstrate that chain stiffness alone, without an attractive inter-particle potential, is a sufficient driving force for the formation of chain-folded lamellae.
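
    As a rough illustration of the angle-bending term that such bead-spring models add to control chain stiffness (the functional form and stiffness constant below are assumptions for the example, not necessarily those used in the paper), one can evaluate a per-angle bending energy along a chain of bead positions:

```python
# Toy bending-energy evaluation for a bead-spring chain (illustrative only).
import numpy as np

def bending_energy(positions, k_bend=5.0):
    """Sum an assumed angle-bending term E = k_bend * (1 - cos(theta)) over
    consecutive bead triplets, where theta is the angle between successive
    bond vectors: zero for a straight chain, maximal for a tight backfold."""
    b1 = positions[1:-1] - positions[:-2]    # bond vectors i-1 -> i
    b2 = positions[2:] - positions[1:-1]     # bond vectors i -> i+1
    cos_theta = np.sum(b1 * b2, axis=1) / (
        np.linalg.norm(b1, axis=1) * np.linalg.norm(b2, axis=1))
    return k_bend * np.sum(1.0 - cos_theta)

# A straight chain costs nothing; a hairpin-like backfold is penalized.
straight = np.column_stack([np.arange(10.0), np.zeros(10), np.zeros(10)])
hairpin = np.concatenate([straight[:5], straight[4::-1] + [0.0, 1.0, 0.0]])
print(bending_energy(straight), bending_energy(hairpin))
```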

    Turn-taking: From perception to speech preparation

    Wesselmeier H, Müller HM. Turn-taking: From perception to speech preparation. Neuroscience Letters. 2015;609:147-151. We investigated the preparation of a spoken answer response to interrogative sentences by measuring the response time (RT) and the response-related readiness potential (RP). By comparing the RT and RP results, we aimed to identify whether the RP onset is more closely related to the actual speech preparation process or to the pure intention to speak after turn anticipation. Additionally, we investigated whether the RP onset can be influenced by the syntactic structure of the question (one or two completion points). The EEG data were therefore sorted according to two variables: the cognitive load required for the response and the syntactic structure of the stimulus questions. The results for the event-related potential (ERP) associated with the preparation of the response utterance and for the RT suggest that the RP onset is more closely related to the actual speech preparation process than to the pure intention to speak after turn anticipation. However, the RP onset can be influenced by the syntactic structure of the question, leading to early response preparation.
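
    As a heavily simplified illustration of the kind of analysis involved (averaging trials to obtain an ERP and estimating the onset of the readiness potential with a threshold), the sketch below uses synthetic data; the sampling rate, threshold, waveform, and trial count are invented and do not reflect the study's actual recording or pipeline.

```python
# Toy ERP averaging and readiness-potential (RP) onset estimate (illustrative only).
import numpy as np

rng = np.random.default_rng(4)
fs = 250                                     # assumed sampling rate in Hz
t = np.arange(-2.0, 0.5, 1 / fs)             # time relative to speech onset (s)

# Simulate 60 trials: a slow negative drift starting ~1 s before speech onset, plus noise.
rp_template = np.where(t > -1.0, -(t + 1.0) * 4.0, 0.0)   # amplitude in microvolts
trials = rp_template + 5.0 * rng.standard_normal((60, t.size))

erp = trials.mean(axis=0)                    # averaging across trials yields the ERP

# Crude onset estimate: first sample at which the averaged ERP falls below a threshold.
threshold = -1.0
below = np.where(erp < threshold)[0]
onset = t[below[0]] if below.size else None
print("estimated RP onset (s relative to speech onset):", onset)
```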

    A new comparative approach to macroeconomic modeling and policy analysis

    In the aftermath of the global financial crisis, the state of macroeconomic modeling and the use of macroeconomic models in policy analysis have come under heavy criticism. Macroeconomists in academia and policy institutions have been blamed for relying too much on a particular class of macroeconomic models. This paper proposes a comparative approach to macroeconomic policy analysis that is open to competing modeling paradigms. Macroeconomic model comparison projects have helped produce some very influential insights, such as the Taylor rule. However, they have been infrequent and costly, because they require the input of many teams of researchers and multiple meetings to obtain a limited set of comparative findings. This paper provides a new approach that enables individual researchers to conduct model comparisons easily, frequently, at low cost, and on a large scale. Using this approach, a model archive is built that includes many well-known, empirically estimated models that can be used for quantitative analysis of monetary and fiscal stabilization policies. A computational platform is created that allows straightforward comparisons of the models' implications. Its application is illustrated by comparing different monetary and fiscal policies across selected models. Researchers can easily include new models in the database and compare the effects of novel extensions to established benchmarks, thereby fostering a comparative rather than insular approach to model development.
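
    Since the Taylor rule is cited above as one influential product of model comparison work, the sketch below evaluates a Taylor-type policy rule under two illustrative calibrations; the coefficients and the inflation and output-gap figures are invented for the example and are not taken from the paper's model database.

```python
# Taylor-type policy rule evaluated under two illustrative calibrations.
def taylor_rate(inflation, output_gap, r_star=2.0, pi_star=2.0, phi_pi=1.5, phi_y=0.5):
    """Nominal policy rate implied by i = r* + pi + phi_pi * (pi - pi*) + phi_y * gap."""
    return r_star + inflation + phi_pi * (inflation - pi_star) + phi_y * output_gap

scenarios = [
    ("benchmark calibration", dict(phi_pi=1.5, phi_y=0.5)),
    ("more aggressive on inflation", dict(phi_pi=2.0, phi_y=0.25)),
]

# Hypothetical state of the economy: 3% inflation, -1% output gap.
for name, params in scenarios:
    print(f"{name}: policy rate = {taylor_rate(3.0, -1.0, **params):.2f}%")
```

    A model comparison platform of the kind described would run such policy rules through many structural models and contrast the resulting dynamics, rather than evaluating a single rule in isolation.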